Patent abstract:
The present invention relates to a mobile terminal (100) and a method of controlling the same, allowing a user to easily access a desired screen or application by means of a drag input that starts at a corner of a touch screen (151) and slides in a diagonal direction.
Publication number: FR3029309A1
Application number: FR1554420
Filing date: 2015-05-18
Publication date: 2016-06-03
Inventors: Donghwan Yu; Seojin Lee; Samsick Kim
Applicant: LG Electronics Inc
IPC main class:
Patent description:
[0001] MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME. This application claims priority to Korean patent application No. 10-2014-0167706, filed November 27, 2014 in Korea, the complete contents of which are incorporated herein by reference in their entirety. The present invention relates to a mobile terminal configured with user convenience in mind, and to a method of controlling the same. Terminals may generally be classified as mobile/portable terminals or stationary terminals depending on their mobility. Mobile terminals may in turn be classified as handheld terminals or vehicle-mounted terminals depending on whether a user can directly carry the terminal. Mobile terminals have been given more and more functions. Examples of such functions include data and voice communications, image and video capture via a camera, recording of audio content, playback of music files via a speaker system, and display of images and video on a display unit. Some mobile terminals include additional functionality that supports games, while other terminals are configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast and multicast signals that allow viewing of content such as videos and television programs. Efforts are ongoing to support and increase the functionality of mobile terminals. Such efforts include software and hardware improvements as well as changes and improvements to the structural components that make up the mobile terminal. Therefore, an object of the present invention is to address the aforementioned problems and other problems. Another object of the present invention is to provide a mobile terminal configured to give a user faster access to a desired function by entering a predetermined touch pattern while the display is off, and a method of controlling said terminal.
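The "predetermined touch pattern entered while the display is off" mentioned above can be illustrated with a minimal sketch. All names (`KnockPatternMatcher`, `on_tap`) and the quadrant-based pattern scheme are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch: matching a predetermined touch pattern entered
# while the display is off. Taps are reduced to the screen quadrant
# they fall in ("TL" = top-left, etc.); the matcher signals success
# once the whole pattern has been entered in order.

class KnockPatternMatcher:
    """Compares a sequence of tap positions against a user-defined
    pattern and reports when the full pattern has been matched."""

    def __init__(self, pattern):
        self.pattern = list(pattern)   # e.g. ["TL", "TR", "BL", "BR"]
        self.entered = []

    def on_tap(self, quadrant):
        """Record one tap; return True when the full pattern matches."""
        self.entered.append(quadrant)
        # Discard the attempt as soon as it diverges from the pattern.
        if self.entered != self.pattern[:len(self.entered)]:
            self.entered = []
            return False
        if len(self.entered) == len(self.pattern):
            self.entered = []
            return True   # pattern complete: wake the desired function
        return False

matcher = KnockPatternMatcher(["TL", "TR", "BL", "BR"])
results = [matcher.on_tap(q) for q in ["TL", "TR", "BL", "BR"]]
print(results)  # [False, False, False, True]
```

In a real terminal the controller would receive these taps from the touch sensor (which stays active even when the display is off) and, on a match, turn the screen on or launch the desired function directly.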
[0002] Another object of the present invention is to provide a mobile terminal configured to quickly capture an image at a desired instant by automatically operating a camera in response to a predetermined drag input starting at a corner of a touch screen while the display is off, and a method of controlling said terminal. To accomplish the aforementioned objects and other objects of the present invention, a mobile terminal according to one aspect of the present invention comprises: a body; a camera; a touch screen disposed on the front face of the body and having a plurality of corners; and a controller configured to operate the camera to capture an image upon receipt of a first drag input applied at a first corner of the touch screen and dragged toward the center of the touch screen. The first drag input may be received while the touch screen is off. The controller may be configured to operate the camera to capture an image when the first drag input is released. [0003] The first drag input may have a drag path toward the center of the touch screen and include a plurality of discontinuous drag inputs applied along the drag path, and the controller may be configured to cause an image to be captured each time the first drag input is interrupted. [0004] The controller may be configured to operate a first camera disposed on the front face of the body to capture an image in response to the first drag input when the first corner corresponds to an upper corner of the touch screen, and to operate a second camera disposed on the rear face of the body to capture an image in response to the first drag input when the first corner corresponds to a lower corner of the touch screen. When the first drag input is received while a preview image is displayed on the touch screen as a result of camera operation, the controller may be configured to control the mobile terminal to enter a dual camera mode, that is,
an activation of both a first camera disposed on the front face of the body and a second camera disposed on the rear face of the body. The controller may be configured to check whether a received notification message is present upon receiving a second drag input applied at a second corner of the touch screen and dragged toward the center of the touch screen and, when a received notification message is present, to display one or more application icons relating to the received notification message along a drag path of the second drag input. [0005] The notification message may be a notification message about an application installed in the mobile terminal. Icon badges corresponding to the notification message may be added to the application icons and displayed. One of the first and second corners may correspond to one of the right and left corners of the touch screen, and the other to the opposite corner of the touch screen. The received notification message may include the number of unread text messages and/or the number of unanswered calls and/or application update information. [0006] The controller may be configured to execute a first application corresponding to the point at which the second drag input is released, upon release of the second drag input. When the second drag input is received during execution of the first application, the controller may be configured to execute a second application displayed along the drag path. When the second drag input is held at a specific point on the drag path for a predetermined time, the controller may be configured to maintain the display of the one or more application icons. The controller may be configured to display at least one newly captured image on a window generated in response to the second drag input when no received notification message is present, and to execute a gallery application upon release of the second drag input.
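As a rough illustration of the corner-dependent behaviour described above (a drag starting at an upper corner drives the front camera, a lower corner the rear camera), here is a minimal sketch; the coordinate convention, the corner margin and the function names are assumptions for illustration, not part of the patent:

```python
# Minimal sketch, assuming a coordinate system with (0, 0) at the
# top-left of the touch screen. classify_corner and pick_camera are
# illustrative helper names.

def classify_corner(x, y, width, height, margin=0.15):
    """Return which corner a touch falls in, or None if it is not
    within `margin` (a fraction of the screen size) of any corner."""
    horiz = "left" if x <= width * margin else "right" if x >= width * (1 - margin) else None
    vert = "top" if y <= height * margin else "bottom" if y >= height * (1 - margin) else None
    if horiz and vert:
        return f"{vert}-{horiz}"
    return None

def pick_camera(start_corner):
    """Per the behaviour described above: a drag starting at an upper
    corner drives the front camera, a lower corner the rear camera."""
    if start_corner in ("top-left", "top-right"):
        return "front"
    if start_corner in ("bottom-left", "bottom-right"):
        return "rear"
    return None

corner = classify_corner(40, 50, 1080, 1920)  # touch near the top-left corner
print(corner, "->", pick_camera(corner))      # top-left -> front
```

The controller would run such a classification on the touch-down position of the drag, track the drag toward the screen center, and trigger the chosen camera when the drag is released.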
The controller may be configured to display information relating to a specific application on the touch screen upon receipt of the first drag input while an execution screen of that specific application is displayed on the touch screen. The controller may be configured to display a lock screen corresponding to a lock mode on the touch screen and, when the first drag input is received while the lock screen is displayed, to set an image captured by operating the camera as a background image of the lock screen. The controller may be configured to display the captured image set as the background image of the lock screen. When a third drag input applied at a third corner of the touch screen and dragged toward the center of the touch screen is received, the controller may be configured to display one or more application icons along a drag path of the third drag input. [0007] The third corner may correspond to a lower left corner or a lower right corner of the touch screen. The controller may be configured to sequentially display recently executed applications along the path of the third drag input, in response to the third drag input. [0008] A method of controlling a mobile terminal according to another aspect of the present invention comprises: receiving a first drag input applied at a first corner of a touch screen having a plurality of corners and dragged toward the center of the touch screen; and actuating a camera to capture an image upon release of the first drag input. [0009] Further scope of the applicability of the present application will become more apparent from the detailed description given below.
However, it should be understood that the detailed description and the specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from the detailed description. Other features and advantages of the invention will emerge more clearly from a reading of the following description, made with reference to the appended drawings, which are given by way of illustration only and in no way limit the present invention, and in which: FIG. 1A is a block diagram of a mobile terminal according to the present disclosure; FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, seen from different directions; FIG. 2 is a flowchart of a method for controlling a mobile terminal, explaining the concept applied to embodiments of this disclosure; FIG. 3 is a view for explaining the corners of a touch screen as applied to the embodiments of this disclosure; FIG. 4 is a flowchart illustrating a method of controlling a mobile terminal according to a first embodiment of this disclosure; FIGS. 5 to 14 are views for explaining an exemplary implementation of the method of controlling a mobile terminal according to the first embodiment of this disclosure; FIG. 15 is a flowchart illustrating a method of controlling a mobile terminal according to a second embodiment of this disclosure; FIGS. 16A to 19D are views for explaining an exemplary implementation of the method of controlling a mobile terminal according to the second embodiment of this disclosure; FIG. 20 is a flowchart illustrating a method of controlling a mobile terminal according to a third embodiment of this disclosure; FIGS. 21A to 23B are views for explaining an exemplary implementation of the method of controlling a mobile terminal according to the third embodiment of this disclosure; FIG. 24 is a flowchart illustrating a method of controlling a mobile terminal according to a fourth embodiment of this disclosure; FIGS. 25A to 26C are views for explaining an exemplary implementation of the method of controlling a mobile terminal according to the fourth embodiment of this disclosure; and FIGS. 27 to 28B are views for explaining an exemplary implementation of a method of controlling a mobile terminal according to a fifth embodiment of the present disclosure. [0010] The present invention will now be described in detail according to embodiments given here by way of example and with reference to the accompanying drawings. In the interest of brevity of the description with reference to the drawings, identical or equivalent components may carry the same reference numbers and their description will not be repeated. In general, generic terms such as "module" and "unit" may be used to refer to elements or components. Such generic terms are used here primarily to facilitate the description of the specification, and the generic terms themselves are not intended to convey any special meaning or function. In this disclosure, what is well known to those skilled in the art has generally been omitted for the sake of brevity. The accompanying drawings serve to facilitate understanding of the various technical features, and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. Thus, the present disclosure should be construed as extending to all equivalents, modifications and substitutes in addition to those particularly shown in the accompanying drawings. It should be understood that although the terms first, second, etc. may be used here to describe various elements, these elements should not be limited by these terms.
These terms are generally used only to distinguish one element from another. It should be understood that when an element is referred to as being "connected to" another element, the element may be connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected to" another element, there are no intervening elements present. A singular representation may include a plural representation unless it has a clearly different meaning in context. Terms such as "include" or "have" are used herein, and it should be understood that they are intended to indicate the existence of the various components, functions or steps presented in this specification; it should also be understood that more or fewer components, functions or steps may likewise be used. [0011] The mobile terminals presented here can be implemented using a large number of different types of terminals. Examples of such terminals include cell phones, smartphones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, electronic book readers, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)) and the like. [0012] By way of non-limiting example only, the description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as those mentioned above. In addition, these teachings may also be applied to stationary terminals such as digital TVs, desktop computers and similar devices. Reference will now be made to FIGS. 1A to 1C, of which FIG. 1A is a block diagram of a mobile terminal according to the present disclosure, while FIGS. 1B and 1C are conceptual views of an example of the mobile terminal, seen from different directions.
[0013] The mobile terminal 100 is presented with components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power supply unit 190. It should be understood that implementing all of the illustrated components is not a requirement and that, as a variant, more or fewer components may be implemented. Referring now to FIG. 1A, the mobile terminal 100 is presented with a wireless communication unit 110 configured with several commonly implemented components. For example, the wireless communication unit 110 typically includes one or more components that enable wireless communication between the mobile terminal 100 and a wireless communication system or network in which the mobile terminal is located. The wireless communication unit 110 typically includes one or more modules that enable communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, and communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 typically includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 comprises one or more of the following modules: a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115. [0014] The input unit 120 includes a camera 121 for obtaining images or videos, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a soft key and the like) allowing the user to input information. Data (e.g., audio, video, image, etc.)
is obtained by the input unit 120 and can be analyzed and processed by the controller 180 according to device parameters, user commands and combinations thereof. The detection unit 140 is typically implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information and other similar information. For example, in FIG. 1A, the detection unit 140 is shown provided with a proximity sensor 141 and an illumination sensor 142. If desired, the detection unit 140 may alternatively or additionally comprise other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint sensor, an ultrasonic sensor, an optical sensor (for example, the camera 121), a microphone 122, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor and a gas sensor, among others) and a chemical sensor (for example, an electronic nose, a health care sensor, a biometric sensor and the like), to name a few. The mobile terminal 100 may be configured to use information obtained from the detection unit 140 and, in particular, information obtained from one or more sensors of the detection unit 140 and combinations thereof. [0015] The output unit 150 is typically configured to output various types of information, such as audio, video and touch outputs. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153 and an optical output module 154. The display unit 151 may have an interlayered structure or an integrated structure with a touch sensor in order to realize a touch screen.
The touch screen may provide an output interface between the mobile terminal 100 and a user, and may also function as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device in response to the external device being connected to the interface unit 160. The memory 170 is typically implemented to store data in order to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs can be downloaded from an external server via wireless communication. Other application programs may be installed in the mobile terminal 100 at the time of its manufacture or shipment, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, making a call, receiving a message, sending a message, etc.). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100 and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.
[0016] The controller 180 typically functions to control all operations of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 can provide or process information or functions appropriate for a user by processing signals, data, information and the like which are input or output by the various components shown in FIG. 1A, or by activating the application programs stored in the memory 170. For example, the controller 180 controls some or all of the components shown in FIGS. 1A to 1C according to the execution of an application program that has been stored in the memory 170. The power supply unit 190 may be configured to receive external power or provide internal power in order to supply the appropriate power required for operating the elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be integrated into the terminal body or to be detachable from the terminal body. [0017] Still with reference to FIG. 1A, the various components shown in this figure will now be described in more detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast-associated information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate simultaneous reception of two or more broadcast channels, or to support switching between broadcast channels. [0018] The mobile communication module 112 may transmit wireless signals to and/or receive wireless signals from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server and the like.
Such network entities form part of a mobile communication network, which is built according to technical standards or communication methods for mobile communications (for example, Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A) and the like). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, and data in various formats supporting text and multimedia message communication. The wireless Internet module 113 is configured to facilitate wireless Internet access. This module can be coupled internally or externally to the mobile terminal 100. The wireless Internet module 113 can transmit and/or receive wireless signals via communication networks according to wireless Internet technologies. Examples of such wireless Internet technologies include Wireless LAN (WLAN), Wi-Fi, Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-Advanced (LTE-A) and the like. The wireless Internet module 113 can transmit/receive data according to one or more of these wireless Internet technologies, and also according to other Internet technologies.
In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. Thus, the wireless Internet module 113 can cooperate with the mobile communication module 112 or operate as the mobile communication module 112. The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTHTM, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, Wireless USB and the like. The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located, via wireless networks. One example of such wireless networks is a wireless personal area network. In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which is capable of exchanging data with the mobile terminal 100 (or of cooperating with the mobile terminal 100). The short-range communication module 114 can detect or recognize the wearable device and allow communication between the wearable device and the mobile terminal 100.
In addition, when the detected wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180 may, for example, cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Thus, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Likewise, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device. The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. By way of example, the location information module 115 comprises a GPS (Global Positioning System) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal. [0019] For example, when the mobile terminal uses the GPS module, a position of the mobile terminal can be acquired using a signal sent by a GPS satellite. In another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information relating to a wireless access point (AP) that transmits a wireless signal to the Wi-Fi module or receives one from it. The input unit 120 may be configured to allow various types of input to the mobile terminal 100. Examples of such input include audio, image and video content, data and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 can process still picture or video frames obtained by image sensors in an image or video capture mode.
The processed image frames may be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to allow the mobile terminal 100 to acquire a plurality of images having various angles or focal points. In another example, the cameras 121 may be placed in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image. [0020] The microphone 122 is generally implemented to allow audio input into the mobile terminal 100. The audio input can be processed in a variety of ways according to a function being performed in the mobile terminal 100. If desired, the microphone 122 may include noise canceling algorithms for suppressing unwanted noise generated in the course of receiving external audio signals. The user input unit 123 is a component that allows input by a user. Such user input may enable the controller 180 to control an operation of the mobile terminal 100. The user input unit 123 may include one or more mechanical input elements (for example, a key, a button located on a front and/or rear surface or on a side surface of the mobile terminal 100, a dome switch, a thumb wheel, a push button, etc.) or a touch-sensitive input, among others. For example, the touch input may be a virtual key or soft key displayed on a touch screen through software processing, or a touch key located on the mobile terminal at a location other than the touch screen. The virtual key or visual key can be displayed on the touch screen in various forms, for example, a graphic, a text, an icon, a video or a combination thereof. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, information of the surrounding environment of the mobile terminal, user information or similar information.
The controller 180 generally cooperates with the detection unit 140 to control an operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal, on the basis of the detection provided by the detection unit 140. The detection unit 140 can be implemented using any one of a large number of sensors, some of which will now be described in more detail. The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays or similar means, without mechanical contact. The proximity sensor 141 may be arranged in an internal region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141, for example, may comprise a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor and similar sensors. When the touch screen implemented is of the capacitive type, the proximity sensor 141 can detect the proximity of a pointer relative to the touch screen by variations of an electromagnetic field, which is responsive to the approach of an object having conductivity. In this case, the touch screen (touch sensor) can also be categorized as a proximity sensor. [0021] The term "proximity touch" will often be used herein to refer to the scenario in which a pointer is positioned close to the touch screen without contacting the touch screen. The term "contact touch" will often be used herein to refer to the scenario in which a pointer makes physical contact with the touch screen.
For a position corresponding to a proximity touch of the pointer relative to the touch screen, such a position corresponds to the position at which the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch and proximity touch patterns (e.g., distance, direction, speed, time, position, state of motion, etc.). In general, the controller 180 processes data corresponding to the proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes visual information to be output on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to perform different operations or to process different data depending on whether a touch at a point on the touch screen is a proximity touch or a contact touch. A touch sensor may detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type and a magnetic type, among others. For example, the touch sensor may be configured to convert changes of pressure applied to a specific portion of the display unit 151, or a capacitance appearing at a specific portion of the display unit 151, into electrical input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is typically used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a pen, a stylus, a pointer or the like. When a touch input is detected by a touch sensor, corresponding signals can be transmitted to a touch controller. The touch controller can process the received signals and then transmit corresponding data to the controller 180.
Accordingly, the controller 180 can detect the region of the display unit 151 that has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or a combination of the two. [0022] In some embodiments, the controller 180 may execute the same or different commands depending on the type of contact object that touches the touch screen or a type of touch key provided in addition to the touch screen. Whether to execute the same command or a different command depending on the object which provides a touch input can be decided, for example, on the basis of a current state of operation of the mobile terminal 100 or of an application program currently executed. The touch sensor and the proximity sensor can be implemented, individually or in combination, to detect various types of touch. Such touches include a short touch (or tap), a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, etc. If desired, an ultrasonic sensor may be implemented to recognize positional information relating to a contact object using ultrasonic waves. The controller 180, for example, can calculate the position of a wave generating source based on the information detected by a light sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time the light takes to reach the optical sensor is much shorter than the time the ultrasonic wave takes to reach the ultrasonic sensor. The position of the wave generating source can be calculated using this fact. For example, the position of the wave generating source can be calculated from the time difference with respect to the time at which the ultrasonic wave reaches the sensor, the light serving as a reference signal. The camera 121 typically includes at least one camera sensor (CCD, CMOS, etc.), a photographic sensor (or image sensors) and a laser sensor.
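The time-difference ranging described above for the ultrasonic sensor can be sketched numerically. This is a minimal illustration only, not the patent's implementation; the function name and values are hypothetical, and light is treated as arriving instantaneously relative to sound.

```python
# Minimal sketch of light/ultrasound time-of-arrival ranging: the light
# pulse reaches the optical sensor almost instantly, so the delay of the
# ultrasonic arrival relative to the light reference gives the distance.

SPEED_OF_SOUND_MM_PER_S = 343_000.0  # approximate speed of sound in air

def distance_to_source_mm(t_light_s: float, t_ultrasound_s: float) -> float:
    """Estimate the distance to the wave generating source from the time
    difference between the light reference and the ultrasonic arrival."""
    dt = t_ultrasound_s - t_light_s
    if dt < 0:
        raise ValueError("ultrasound cannot arrive before the light reference")
    return SPEED_OF_SOUND_MM_PER_S * dt

# With three or more ultrasonic sensors, each such distance constrains the
# source to a circle; intersecting the circles yields 2D coordinates
# (trilateration), which is how a position rather than a range is obtained.
```

For example, an ultrasonic delay of one millisecond corresponds to roughly 343 mm from the sensor.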
The implementation of the camera 121 with a laser sensor may allow detection of a contact of a physical object with respect to a 3D stereoscopic image. The photographic sensor may be laminated to the display or may partially cover the display. The photographic sensor may be configured to analyze the motion of a physical object near the touch screen. In more detail, the photographic sensor may comprise lines and columns of photodiodes and phototransistors for analyzing the content received at the photographic sensor by means of an electrical signal which varies according to the amount of light applied. More precisely, the photographic sensor can calculate the coordinates of the physical object according to a variation of light so as to obtain position information of the physical object. [0023] The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing on the mobile terminal 100, or user interface (UI) information and graphical user interface (GUI) information in response to the execution screen information. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display system such as a stereoscopic system (a system with glasses), an auto-stereoscopic system (a system without glasses), a projection system (holographic system) or similar systems. The audio output module 152 is generally configured to output audio data. Such audio data can be obtained from any one of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory.
The audio data can be output during modes such as a signal receiving mode, a calling mode, a recording mode, a voice recognition mode, a broadcast receiving mode and similar modes. The audio output module 152 may provide an audible output relating to a particular function (e.g., a call signal receiving tone, a message receiving tone, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a loudspeaker, a buzzer or a similar device. A haptic module 153 may be configured to generate various tactile effects that a user feels, perceives or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is a vibration. The intensity, pattern, etc. of the vibration generated by the haptic module 153 can be controlled by a user selection or by a controller setting. For example, the haptic module 153 may emit different vibrations in a combined manner or in a sequential manner. In addition to a vibration, the haptic module 153 can generate various other tactile effects, including a stimulating effect such as an arrangement of pins moving vertically to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch on the skin, a contact of an electrode, an electrostatic force, an effect reproducing the sensation of cold and heat using an element that can absorb or generate heat, etc. The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation, such as in the fingers or the arm of the user, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be present depending on the particular configuration of the mobile terminal 100. An optical output module 154 may output a signal to indicate the generation of an event using light from a light source.
Examples of events generated in the mobile terminal 100 may include receiving a message, receiving a call signal, a missed call, an alarm, a schedule notification, an e-mail reception, the reception of information through an application, etc. A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light with a plurality of colors. The output signal can be stopped, for example, when the mobile terminal detects that the user has checked the generated event. [0024] The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted by an external device, receive energy to be transferred to elements and components within the mobile terminal 100, or transmit internal data from the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or similar ports. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and similar modules. In addition, the device having the identification module (also referred to herein as an "identification device") may take the form of a smart card. As a result, the identification device can be connected to the terminal 100 via the interface unit 160.
When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passage to allow energy from the cradle to be supplied to the mobile terminal 100, or can serve as a passage to allow various control signals entered by the user on the cradle to be transferred to the mobile terminal through the interface unit. Various control signals or the energy input from the cradle may function as signals for recognizing that the mobile terminal is properly mounted on the cradle. The memory 170 may store programs supporting the operations of the controller 180 and store input/output data (e.g., a phone book, messages, still images, videos, etc.). The memory 170 can store data relating to the various forms of vibration and the audio sounds that are output in response to touch inputs on the touch screen. The memory 170 may comprise one or more types of storage media including a flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card of micro type, a card-type memory (for example, SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable and programmable read only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, a magnetic disk, an optical disk and similar memory devices. The mobile terminal 100 may also be used in connection with a network storage device which performs the storage function of the memory 170 over a network such as the Internet. The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may enable or disable a lock state to prevent a user from entering a control command for applications when a state of the mobile terminal satisfies a predefined condition.
The controller 180 may also perform control and processing associated with voice calls, data communications, video calls, etc., or perform pattern recognition processing to recognize a handwriting input or a picture drawing input made on the touch screen as characters or images, respectively. In addition, the controller 180 may control one of these components or a combination of these components to implement the various embodiments given herein by way of example. The power supply unit 190 receives external energy or provides internal energy and supplies the appropriate energy required to operate the respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery that is typically rechargeable or detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger is electrically connected to supply power for recharging the battery. In another example, the power supply unit 190 may be configured to recharge the battery in a wireless manner without using the connection port. In this example, the power supply unit 190 can receive power, transferred from an external wireless power transmitter, using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance. [0025] Various embodiments described herein may be implemented on a computer-readable medium, on a machine-readable medium or on a similar medium using, for example, software components, hardware components or any combination thereof. Referring now to Figures 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body.
However, the mobile terminal 100 may alternatively be implemented in any of a large number of different configurations. Examples of such configurations include a watch type, a clip type, a glasses type, as well as a folder type, a flip type, a slide type, a swing type and a swivel type in which two or more bodies are combined with each other in a relatively movable manner, and combinations of these types. The description will often relate to a particular type of mobile terminal (for example, a bar type, a watch type, a glasses type, etc.). However, such teachings regarding a particular type of mobile terminal will generally also apply to other types of mobile terminals. The mobile terminal 100 will generally include a housing (for example, frame, box, cover, etc.) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may be further positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown located on the front side of the terminal body for outputting information. As shown in the figure, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, etc. A rear cover 103 is shown covering the electronic components, and this cover can be releasably coupled to the rear housing 102. Therefore, when the rear cover 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are exposed.
As shown in the figure, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, upon the coupling, the rear housing 102 may also be fully shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b. The housings 101, 102, 103 may be formed of injection-molded synthetic resin or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti) or a similar metal. [0026] As an alternative to the example in which the plurality of housings form an internal space for housing components, the mobile terminal 100 may be configured such that a single housing forms the internal space. In this example, a mobile terminal 100 having a single body is formed in such a manner that the synthetic resin or metal extends from a side surface to a rear surface. [0027] If desired, the mobile terminal 100 may include a waterproofing unit (not shown) to prevent the ingress of water into the terminal body. For example, the waterproofing unit may include a waterproofing member located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, for hermetically sealing an internal space when these housings are assembled. Figures 1B and 1C show some components as they are arranged on the mobile terminal. However, it should be understood that other arrangements are possible and remain within the teachings of this disclosure. Some components may be omitted or arranged differently. For example, the first handling unit 123a may be located on another surface of the terminal body and the second audio output module 152b may be located on the side surface of the terminal body. [0028] The display unit 151 outputs information processed in the mobile terminal 100.
The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, an electronic ink display and combinations of these displays. The display unit 151 may be implemented using two display devices, which may implement the same or different display technologies. For example, a plurality of display units 151 may be arranged on one side only, either spaced apart from one another or integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor that detects a touch input received on the display unit. When a touch is made on the display unit 151, the touch sensor can be configured to detect that touch and the controller 180, for example, can generate a control command or another signal corresponding to the touch. The content that is input in a tactile manner can be a text or numeric value, or a menu item that can be indicated or designated in various ways. [0029] The touch sensor can be configured in the form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or in the form of a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor can be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen can serve as a user input unit 123 (see Figure 1A). Therefore, the touch screen can replace at least some of the functions of the first handling unit 123a.
[0030] The first audio output module 152a may be implemented as a speaker to output voice audio sounds, alarm sounds, multimedia audio reproduction, and the like. [0031] The window 151a of the display unit 151 will typically include a port for passing audio sounds generated by the first audio output module 152a. One variation is to allow the audio sounds to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, an independently formed hole for outputting audio sounds may not be visible or may otherwise be hidden in appearance, further simplifying the appearance and fabrication of the mobile terminal 100. The optical output module 154 may be configured to output light indicating the generation of an event. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notification, an e-mail reception, the receipt of information through an application, etc. When a user has checked a generated event, the controller can control the optical output module 154 to stop the light output. [0032] The first camera 121a can process image frames, such as still or moving images, obtained by the image sensor in a capture mode or in a video call mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. The first and second manipulation units 123a and 123b are examples of the user input unit 123 and may be manipulated by a user to provide an input to the mobile terminal 100. The first and second handling units 123a and 123b may also be commonly referred to as a manipulating portion and may employ any tactile method that allows the user to perform a manipulation such as contact, pressure, scrolling, etc. The first and second handling units 123a and 123b can also employ any non-tactile method that allows the user to perform a manipulation such as a proximity touch, a hovering touch, etc.
Figure 1B illustrates the first handling unit 123a as a touch key, but possible variants include a mechanical key, a push button, a touch key, and combinations of these keys. An input received on the first and second handling units 123a and 123b can be used in various ways. For example, the first handling unit 123a may be used by the user to provide an input for a menu, a home key, a cancellation, a search, etc., and the second handling unit 123b may be used by the user to provide an input for controlling the volume level output from the first or second audio output module 152a or 152b, for switching to a touch recognition mode of the display unit 151, or for a similar function. As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a number of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, controlling the volume level output from the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, etc. The rear input unit can be configured to allow a touch input, a pressure input, or combinations of these inputs. The rear input unit may be located to overlap the display unit 151 of the front side in the thickness direction of the terminal body. By way of example, the rear input unit may be located on an upper end portion of the rear face of the terminal body so that a user can easily manipulate it with an index finger when the user holds the terminal body with one hand. Alternatively, the rear input unit can be positioned at almost any position on the rear side of the terminal body. Embodiments that include the rear input unit may implement some or all of the functionality of the first handling unit 123a in the rear input unit.
Thus, in situations where the first handling unit 123a is omitted from the front side, the display unit 151 may have a larger screen. In another variant, the mobile terminal 100 may include a fingerprint sensor that scans a fingerprint of a user. The controller 180 can then use the fingerprint information detected by the fingerprint sensor as part of an authentication procedure. The fingerprint sensor may also be installed in the display unit 151 or implemented in the user input unit 123. [0033] The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones can be implemented, such an arrangement permitting the reception of stereophonic sounds. [0034] The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may have one or more connection terminals for connecting to another device (for example, an earphone, an external speaker, etc.), a port for near-field communication (for example, an IrDA port, a Bluetooth port, a wireless LAN port, etc.), or a power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a connection interface for housing an external card, such as a subscriber identification module (SIM), a user identity module (UIM) or a memory card for storing information. [0035] The second camera 121b is shown on the rear side of the terminal body and has an image capture direction that is substantially opposite to the image capture direction of the first camera unit 121a. If desired, the second camera 121b may alternatively be located at other locations, or may be made movable so as to have an image capture direction different from that shown. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such a camera may be referred to as an "array camera".
When the second camera 121b is implemented as an array camera, images can be captured in various ways using the plurality of lenses, and images of better quality are obtained. As can be seen in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. [0036] As can be seen in FIG. 1B, the second audio output module 152b may be located on the body of the terminal. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used for implementing a speakerphone mode for telephone communications. At least one antenna for wireless communication may be located on the body of the terminal. The antenna may be installed in the body of the terminal or formed by the housing. For example, an antenna that configures a portion of the broadcast receiving module 111 may be retractable into the body of the terminal. Alternatively, an antenna may be formed with a film attached to an inner surface of the rear cover 103, or with a housing that includes a conductive material. A power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191 which is mounted in the terminal body or detachably coupled to an outer portion of the terminal body. The battery 191 can receive power via a power source cable connected to the interface unit 160. In addition, the battery 191 can be recharged in a wireless manner using a wireless charger. Wireless charging can be implemented by magnetic induction or electromagnetic resonance. The rear cover 103 is shown coupled to the rear housing 102 to shield the battery 191, to prevent detachment of the battery 191 and to protect the battery 191 against external impact or against foreign material.
When the battery 191 is detachable from the body of the terminal, the rear cover 103 can be releasably coupled to the rear housing 102. An accessory for protecting the appearance or for assisting or extending the functions of the mobile terminal 100 may also be provided on the mobile terminal 100. An example of such an accessory may be a cover or a case for covering or housing at least one surface of the mobile terminal 100. The cover or case may cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of an accessory is a touch pen for assisting or extending a touch input to a touch screen. Other preferred embodiments will be described in more detail with reference to additional figures. It should be understood by those skilled in the art that the present features can be realized in several forms without departing from the characteristics of the present invention. Fig. 2 is a flow diagram of a mobile terminal control method for explaining the concept applied to the embodiments of the present invention, and Fig. 3 is a view for explaining touch screen angles applied to embodiments of the present invention. The controller 180 of the mobile terminal 100 according to one embodiment of the present invention receives a touch input applied from an angle of the touch screen 151 and slid toward the center of the touch screen 151 (in a diagonal direction) (S10). The touch input can be a drag input. According to one embodiment of the present invention, different functions can be performed according to the angle of the touch screen 151 at which the drag input starts. [0037] The controller 180 may receive a drag input starting at a first angle of the touch screen 151 (S11). In this case, the controller 180 can automatically execute a camera application (S12). The controller 180 may receive a drag input starting at a second angle of the touch screen 151 (S13).
In this case, the controller 180 may display a predetermined list of applications on the touch screen 151 (S14). Here, one of the first angle and the second angle may correspond to one of the upper left angle and the upper right angle of the touch screen 151, and the other may correspond to the other of these two angles. However, the present invention is not limited thereto. The present invention may use any angle of the touch screen 151 for the above operation. The angles of the touch screen 151 at which a drag input starts will now be described in detail with reference to FIG. 3. With reference to FIG. 3, the touch screen 151 of the mobile terminal 100 may be disposed on the front face of the terminal body. The touch screen 151 may comprise a plurality of edges 151a, 151b, 151c and 151d. The first edge 151a may be the right edge of the touch screen 151, the second edge 151b may be the upper edge of the touch screen 151, the third edge 151c may be the left edge of the touch screen 151 and the fourth edge 151d may be the lower edge of the touch screen 151. The edges 151a, 151b, 151c and 151d can form a touch screen 151 of rectangular shape. However, the shape of the touch screen 151, formed by the edges 151a, 151b, 151c and 151d, is not limited to the rectangular shape and can be changed in various ways. [0038] In the present invention, a point at which two of the edges 151a, 151b, 151c and 151d meet is called an angle. For example, a point at which the first edge 151a and the second edge 151b meet is called a first angle 151_C1. The first angle 151_C1 corresponds to the upper right corner of the touch screen 151. A point at which the second edge 151b and the third edge 151c meet is called a second angle 151_C2. The second angle 151_C2 corresponds to the upper left corner of the touch screen 151. A point at which the third edge 151c and the fourth edge 151d meet is called a third angle 151_C3.
The third angle 151_C3 corresponds to the lower left corner of the touch screen 151. A point at which the fourth edge 151d and the first edge 151a meet is called a fourth angle 151_C4. The fourth angle 151_C4 corresponds to the lower right corner of the touch screen 151. According to one embodiment of the present invention, each angle may comprise at least a frame portion B in addition to the corner portion of the touch screen 151. That is, when the controller 180 detects a touch input applied at the first angle 151_C1, the controller 180 can recognize a drag input starting in the frame region B as a drag input starting at the first angle. [0039] Although a point where two edges meet is described as an angle in the above description, the angles may correspond to the 12 o'clock, 3 o'clock, 6 o'clock and 9 o'clock points when the display unit of the mobile terminal 100 has a circular shape (for example, a watch-type mobile terminal). FIG. 4 is a flowchart illustrating a method of controlling a mobile terminal according to a first embodiment of the present invention, and FIGS. 5 to 14 are views for explaining exemplary implementations of the method of controlling a mobile terminal according to the first embodiment of the present invention. The method of controlling a mobile terminal according to the first embodiment of the present invention can be implemented in the mobile terminal 100 described with reference to FIGS. 1A to 1C. The method of controlling a mobile terminal according to the first embodiment of the present invention, and the operations of the mobile terminal 100 to implement the method, will be described below with reference to the appended drawings. With reference to FIG. 4, the controller 180 may receive a first drag input slid from the first angle of the touch screen 151 toward the center of the touch screen 151 (S110) while the display unit is off (S100: YES).
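The classification of a touch start point into one of the four angles 151_C1 to 151_C4, including the frame portion B around the screen edge, can be sketched as follows. This is a minimal illustration under assumed coordinates and thresholds: the function name, the corner size and the bezel margin are hypothetical, not values from the patent.

```python
# Hypothetical sketch of classifying a drag start point into a corner
# region C1-C4 (cf. Fig. 3), with (0, 0) at the upper left of the touch
# screen and a bezel margin B extending beyond the screen boundary.

CORNER_SIZE = 80   # px: square corner region measured from each screen corner
BEZEL = 20         # px: frame portion B surrounding the touch screen

def classify_corner(x, y, width, height):
    """Return 'C1'..'C4' if (x, y) falls in a corner region, else None.
    C1: upper right, C2: upper left, C3: lower left, C4: lower right."""
    in_left = -BEZEL <= x <= CORNER_SIZE
    in_right = width - CORNER_SIZE <= x <= width + BEZEL
    in_top = -BEZEL <= y <= CORNER_SIZE
    in_bottom = height - CORNER_SIZE <= y <= height + BEZEL
    if in_right and in_top:
        return "C1"
    if in_left and in_top:
        return "C2"
    if in_left and in_bottom:
        return "C3"
    if in_right and in_bottom:
        return "C4"
    return None
```

Because the bezel margin is included, a slightly negative coordinate (a touch beginning on the frame portion B) is still attributed to the adjacent corner, which matches the recognition behavior described above.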
It can be understood that the state in which the display unit is off is a state in which the mobile terminal 100 is in a lock mode. The lock mode of the mobile terminal 100 can be classified into two modes. [0040] The first lock mode corresponds to a case in which power is not supplied to the touch screen 151 and thus no information is provided through the touch screen 151. The second lock mode corresponds to a case in which power is supplied to the touch screen 151 to allow predetermined information to be provided via the touch screen 151, and the lock mode can be canceled by a manipulation applied to the touch screen 151 or by a predetermined manipulation. The first lock mode can be switched to the second lock mode or canceled according to a predetermined manipulation. [0041] In the first embodiment of the present invention, the 'display off' state may be based on operation of the mobile terminal 100 in the first lock mode. The present invention may also be applied to a case in which the mobile terminal 100 operates in the aforementioned second lock mode. Examples of such an operation will be described in a fourth embodiment (FIGS. 24 to 26C). Figs. 5 and 6 are views for explaining an example of receiving a drag input starting at an angle of the display (touch screen) while the display is off. Referring to Fig. 5, the first angle corresponds to the upper right corner of the touch screen 151 and may include a portion of the frame, as described above. [0042] Here, the direction toward the center can refer to a direction toward the center of the touch screen 151 from the first angle C1. Therefore, when the first drag input is slid toward the center of the touch screen 151, the first drag input can be dragged toward the lower left corner of the touch screen 151. Here, sliding toward the center of the touch screen 151 does not necessarily mean passing through the center point of the touch screen 151.
For example, the drag operation may include a touch applied at the first angle and a touch slid in a diagonal direction within a predetermined range based on the first angle. [0043] Referring to FIG. 6, when the first angle C1 of the touch screen 151 is touched and the touch is then held for a predetermined time, the controller 180 can display the range of diagonal directions within which a drag input must be applied to perform an operation according to an embodiment of the present invention. Therefore, a user can easily perform the operation according to one embodiment of the present invention by being guided in the drag direction. Referring back to FIG. 4, the controller 180 determines whether the first drag input applied in a predetermined diagonal direction is released at a specific point (S120). [0044] The controller 180 may execute a camera application and control the mobile terminal 100 to capture an image when the first drag input is released at a point on the drag path (S131). Here, the controller 180 can control the camera to operate automatically to capture a frontal image when the first drag input is released, without further selection of a camera application icon, while the display unit 151 of the mobile terminal 100 is off. The mobile terminal 100 according to one embodiment of the present invention may comprise a first camera (121a of FIG. 1B) disposed on the front face of the body of the terminal and a second camera (121b of FIG. 1C) disposed on the rear face of the terminal. Therefore, the second camera 121b can operate to capture a front image viewed by the user upon release of the first drag input. The first camera 121a can also be operated according to the first drag input. [0045] The camera operated according to the first drag input can be preset by the user. When the camera operates to capture an image, the controller 180 can control the display unit 151 to turn on automatically and display the captured image R on the touch screen 151. 
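The corner detection and diagonal-range check described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the screen dimensions, the corner margin (standing in for the frame portion B), the angular tolerance, and all function names are assumptions introduced for the example.

```python
import math

# Assumed screen geometry and thresholds (hypothetical values).
SCREEN_W, SCREEN_H = 1080, 1920
CORNER_MARGIN = 80          # px: the corner region may include part of the frame (B)
ANGLE_TOLERANCE = 25.0      # degrees around the exact corner-to-center diagonal

CORNERS = {
    "C1": (SCREEN_W, 0),         # upper right corner
    "C2": (0, 0),                # upper left corner
    "C3": (0, SCREEN_H),         # lower left corner
    "C4": (SCREEN_W, SCREEN_H),  # lower right corner
}

def corner_of(x, y):
    """Return the corner name if (x, y) lies within a corner region, else None."""
    for name, (cx, cy) in CORNERS.items():
        if abs(x - cx) <= CORNER_MARGIN and abs(y - cy) <= CORNER_MARGIN:
            return name
    return None

def is_diagonal_toward_center(start, end):
    """True if the drag from start to end points toward the screen center
    within ANGLE_TOLERANCE degrees; it need not pass through the center."""
    center = (SCREEN_W / 2, SCREEN_H / 2)
    drag = math.atan2(end[1] - start[1], end[0] - start[0])
    ideal = math.atan2(center[1] - start[1], center[0] - start[0])
    diff = math.degrees(abs(drag - ideal)) % 360
    return min(diff, 360 - diff) <= ANGLE_TOLERANCE
```

A touch near the upper right corner that heads toward the center would qualify as the first drag input; a touch that slides straight down the edge from the same corner would fall outside the displayed diagonal range.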
According to one embodiment of the present invention, since the camera application is automatically executed according to the first drag input, the execution screen of the camera application may be displayed on the touch screen 151 after the image has been captured. Therefore, a preview image P obtained through the camera and the image R captured according to the first drag input can be displayed together on the touch screen 151. According to an embodiment of the present invention, the first drag input can be applied according to various patterns. The controller 180 can define different image capture methods according to the patterns of the first drag input. Referring again to FIG. 4, the first drag input may have a drag path and include a plurality of discontinuous drag inputs applied along the drag path. The controller 180 may receive the plurality of discontinuous drag inputs (S141). [0046] The first drag input can be applied from the first angle (upper right corner) to the third angle (lower left corner). That is, the controller 180 can operate the camera to capture an image once the first drag input starting at the first angle is completed at the third angle. [0047] With reference to FIG. 7, however, discontinuous drag inputs D1, D2 and D3 from the first angle C1 to the third angle C3 may be applied. A drag input applied from the first angle toward the center of the touch screen may be temporarily discontinued at a first point T1. The drag can be extended from the first point T1 toward the center of the touch screen, with the touch of the user's finger remaining on the first touch point T1, and temporarily discontinued at a second point T2. In the same way, the drag can be extended from the second point T2 to the third angle. The drag pattern shown in FIG. 7 is called a plurality of discontinuous drag inputs in the present invention. [0048] Referring back to FIG. 
4, the controller 180 may capture an image through the camera each time a drag input starting from the first angle is discontinued while the display unit 151 is turned off (S143). As a result, three discontinuous drag inputs are applied in the case of FIG. 7 and thus three images C11, C12 and C13 can be captured. Referring to FIG. 8, at the end of the plurality of discontinuous drag inputs in a diagonal direction, the controller 180 may display the last image C13 among the plurality of captured images and a preview image P of the camera on the touch screen 151. [0049] Referring to FIG. 9, according to an embodiment of the present invention, upon receiving an input to select the captured image C13, one or more sharing applications (Mail, Messenger, Drive and Bluetooth) for sharing the captured image with an external device may be displayed on the touch screen 151. The controller 180 may transmit the captured image C13 to the external device using an application selected from the one or more sharing applications. The example of actuating the camera using a diagonal drag input starting at the first angle while the display unit is off in accordance with an embodiment of the present invention can be modified and implemented in other ways. [0050] For example, the rear camera (121b of FIG. 1C) can be operated with a drag input starting at the first angle, while the front camera (121a of FIG. 1B) can be operated with a diagonal drag input starting at the fourth angle corresponding to the lower right corner of the touch screen 151. With reference to FIG. 10A, the controller 180 can receive a drag input starting at the fourth angle C4 and slid toward the center of the touch screen 151. The controller 180 can capture images C21, C22 and C23 through the front camera (121a of FIG. 1B) whenever the drag input starting at the fourth angle C4 and slid in a diagonal direction is discontinued. 
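The one-capture-per-pause behavior of steps S141 and S143 above can be sketched as a simple event fold. This is a hypothetical illustration: the event representation, the event names ("move", "pause", "release"), and the function name are assumptions, not part of the patent.

```python
# Sketch: one image is captured each time the corner-to-corner drag pauses,
# and one more when the drag finally completes (mirroring C11, C12, C13 of FIG. 7).

def captures_for_drag(events):
    """events: list of (kind, (x, y)) tuples produced while the finger stays
    on the screen; kind is "move", "pause", or "release". Returns the points
    at which an image would be captured (a stand-in for firing the camera)."""
    images = []
    for kind, point in events:
        if kind in ("pause", "release"):
            images.append(point)
    return images

# The pattern of FIG. 7: pauses at T1 and T2, then completion at the third angle.
drag = [("move", (900, 200)), ("pause", (700, 400)),    # T1
        ("move", (500, 600)), ("pause", (400, 800)),    # T2
        ("move", (200, 1500)), ("release", (80, 1840))]
```

Running `captures_for_drag(drag)` on the FIG. 7 pattern yields three capture points, matching the three images C11, C12 and C13 described above.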
Referring to FIG. 10B, the controller 180 may display on the touch screen 151 the last captured image C23 and a preview image P acquired through the actuated camera. According to one embodiment of the present invention, burst shooting can be performed through a drag input. [0051] Referring again to FIG. 4, when the first drag input starting at the first angle is held at a point on the drag path for a predetermined time, the controller 180 can perform burst shooting while the first drag input is held (S151). [0052] The controller 180 can perform continuous shooting at a predetermined speed while the first drag input is maintained. With reference to FIG. 11A, the controller 180 can capture images C21, C22, C23, ..., Cn through continuous shooting when the first drag input slid from the first angle C1 in a diagonal direction is applied and held at a first point T1 on the drag path. With reference to FIG. 11B, the controller 180 can simultaneously display on the touch screen 151 a thumbnail image of the last image Cn among the images C21, C22, C23, ..., Cn captured by means of continuous shooting and a preview image P. According to the present invention, it is possible to capture images more quickly using only a diagonal drag input applied to the touch screen 151 in case of urgency, even when the display unit is off. In addition, photographs according to various patterns can be taken using different drag input patterns in case of emergency. According to one embodiment of the present invention, when the first diagonal drag input starting at the first angle is reapplied after execution of the camera application, the image capture function of the camera can be changed. With reference to FIG. 12, when the camera is operated by the first drag input while the display unit 151 is off, the first drag input starting at the first angle can be reapplied according to one embodiment of the present invention. 
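The burst behavior of step S151 above, where continuous shooting runs at a predetermined speed while the drag is held, can be sketched as follows. The frame rate and function names are hypothetical assumptions; the patent specifies only that the speed is predetermined.

```python
# Sketch: continuous (burst) shooting while the first drag input is held
# at a point on the drag path (step S151).

BURST_RATE_FPS = 10  # assumed predetermined shooting speed

def burst_frame_count(hold_seconds, rate_fps=BURST_RATE_FPS):
    """Number of frames captured while the drag is held in place."""
    return int(hold_seconds * rate_fps)

def burst_timestamps(hold_seconds, rate_fps=BURST_RATE_FPS):
    """Capture instants, relative to the start of the hold, for each frame."""
    return [i / rate_fps for i in range(burst_frame_count(hold_seconds, rate_fps))]
```

Holding the drag at T1 for 1.5 seconds at the assumed 10 fps would yield 15 frames, the last of which corresponds to the thumbnail Cn shown in FIG. 11B.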
That is, when the first drag input is received during execution of a first camera application, the controller 180 may execute a second camera application CA having a function different from that of the first camera application. The second camera application may be an application that can capture images using a predetermined filtering function. [0053] Referring to FIG. 13, upon receipt of the first diagonal drag input starting from the first angle during execution of the camera application, the controller 180 may control the camera function of the mobile terminal 100 to be executed in a dual-camera mode. In FIG. 13, when the first drag input is received during actuation of the rear camera (121b of FIG. 1C), the front camera (121a of FIG. 1B) is operated, and a preview image P2 captured by the front camera (121a of FIG. 1B) and a preview image P1 captured by the rear camera (121b of FIG. 1C) can be simultaneously displayed on the touch screen 151. [0054] That is, it is possible to easily enable the dual-camera mode through a predetermined drag input even if a predetermined menu button for activating the dual-camera mode is not displayed on a preview image, according to an embodiment of the present invention. Referring to FIG. 14, the controller 180 may execute the camera application upon receipt of the first diagonal drag input starting at the first angle while a predetermined image G1 is displayed on the touch screen 151 by a gallery application. Here, the controller 180 can control the camera application to run in the dual-camera mode and control the gallery image G1 to be displayed on a screen P2 of the dual-camera mode. FIG. 15 is a flowchart illustrating a method of controlling a mobile terminal according to a second embodiment of the present invention, and FIGS. 16A to 19D are views for explaining an exemplary implementation of the method of controlling a mobile terminal according to the second embodiment of the present invention. 
The method of controlling a mobile terminal according to the second embodiment of the present invention can be implemented in the mobile terminal 100 described above with reference to FIGS. 1A to 1C. The method of controlling a mobile terminal according to the second embodiment of the present invention, and operations of the mobile terminal 100 to implement the method, will be described below with reference to the accompanying drawings. The second embodiment of the present invention may be implemented based on the above-mentioned first embodiment. In addition, the second embodiment may be combined with at least a portion of the first embodiment. Referring to FIG. 15, the controller 180 may receive a second diagonal drag input starting at the second angle (S220) while the display unit 151 of the mobile terminal 100 is off (S210). The second angle may correspond to a position on the touch screen 151 which is different from that of the aforementioned first angle. For example, when the first angle corresponds to the upper right corner of the touch screen 151, the second angle may correspond to the upper left corner of the touch screen 151. [0055] The second drag input may be a drag input starting at the second angle and slid in a diagonal direction. Upon receipt of the second drag input, the controller 180 may determine whether a received notification message is present (S230). When the notification message is present, the controller 180 can display at least one application corresponding to the notification message on the touch screen 151 along the drag path of the second drag input upon receiving the second drag input (S240). With reference to FIG. 15, the controller 180 determines whether the second drag input is released. When the second drag input is released (S250: YES), the application displayed at the position at which the drag input is released can be executed (S260). With reference to FIG. 
16A, upon receipt of the second diagonal drag input starting at the second angle C2 corresponding to the upper left corner of the touch screen 151, the controller 180 can display applications A1, A2 and A3 having notification messages along the path of the second drag input. The applications A1, A2 and A3 can be displayed with icon badges N1, N2 and N3 respectively assigned to the applications. With reference to FIG. 16A, the first application A1 may be a Messenger application, the second application A2 may be a call application and the third application A3 may be a message application. The controller 180 may display the first, second and third applications in the order of generation of the notification messages. That is, the application corresponding to the last generated notification message may be the first application A1. [0056] The received notification messages are notification messages for applications installed and executed in the mobile terminal 100 and correspond to data received from external devices for specific applications. The received notification messages may be data that has not yet been read by the user. For example, the applications installed and running in the mobile terminal may include a call application, a text message application, a Messenger application, and similar applications. A notification message regarding the call application may be a message indicating the presence of an unanswered call. A notification message regarding the text message application or the Messenger application may be an unread text message. A notification message may also include update information of a predetermined application. Icon badges corresponding to the respective notification messages can be attached to the applications and displayed. The icon badges can be provided in the form of a number indicating the number of unconfirmed pieces of information. With reference to FIG. 
16B, the controller 180 may display an execution screen A11 of the first application A1 on the touch screen 151 when the second drag input is released at a point corresponding to the first application A1. Since the first application A1 is the Messenger application, the execution screen of the Messenger application can be displayed on the touch screen 151 upon release of the second drag input. That is, it is possible to directly enter the Messenger application through a predetermined drag input while the display unit is turned off, without performing a process of selecting the icon of the Messenger application to run the Messenger application. Referring to FIG. 17, when the second diagonal drag input starting at the second angle is received again while the execution screen A11 of the first application (Messenger application) is displayed on the touch screen 151, the controller 180 can execute the second application A2 corresponding to the notification message generated immediately before the notification message corresponding to the first application A1 and display an execution screen A21 of the second application A2. Here, the second application can be a call application. With reference to FIG. 18A, the controller 180 may retain the display of the first, second and third applications A1, A2 and A3 displayed in response to the second drag input when the second drag input is held at the position corresponding to the first application A1 for a predetermined time. Referring to FIG. 18B, upon receiving an input to select the second application A2 from the displayed first, second and third applications A1, A2 and A3, the execution screen A21 of the second application A2 may be displayed on the touch screen 151. 
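The ordering and release-to-launch logic of steps S240 to S260 described above can be sketched as follows. This is a hedged illustration only: the tuple layout, timestamps, and function names are assumptions introduced for the example, not part of the patent.

```python
# Sketch: show apps with pending notifications newest-first along the second
# drag path (A1, A2, A3 with badges N1, N2, N3 as in FIG. 16A), then launch
# whichever app lies under the point where the drag is released.

def apps_along_path(notifications):
    """notifications: list of (app_name, badge_count, received_at) tuples.
    Returns (app_name, badge_count) pairs, newest notification first."""
    ordered = sorted(notifications, key=lambda n: n[2], reverse=True)
    return [(name, badge) for name, badge, _ in ordered]

def app_at_release(displayed, release_index):
    """Return the app shown at the slot where the drag was released."""
    name, _badge = displayed[release_index]
    return name
```

With a Messenger notification received last, Messenger appears first on the path (A1), and releasing the drag over its slot launches it directly, as in FIG. 16B.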
That is, according to one embodiment of the present invention, applications corresponding to the generated notifications can be sequentially displayed on the touch screen 151 upon receipt of the second drag input starting at the second angle while the display unit is off. [0057] Referring again to FIG. 15, when no notification message is present even though the second drag input has been received (S230: NO), the controller 180 may display recently captured images T1, T2, T3 and T4 in a virtual zone SA generated according to the second drag input, as shown in FIG. 19A. [0058] The size of the virtual zone SA increases as the length of the second drag input increases, and the enlarged virtual zone SA can display more images. Referring to FIG. 19B, when the second drag input is released while the virtual zone SA is displayed, a gallery application GA may be executed and thus previously captured and stored images may be displayed on the touch screen 151. Referring to FIG. 19C, when the second drag input starting at the second angle is reapplied while the execution screen of the gallery application GA is displayed on the touch screen 151, the controller 180 may run an image editing application EA in addition to the gallery application GA. Referring to FIG. 19D, when the second drag input starting at the second angle is reapplied while a predetermined image is edited through the image editing application EA, the controller 180 may display on the touch screen 151 a list 201 of sharing applications for sharing the edited image. According to the above description, according to one embodiment of the present invention, the execution of an application can be controlled upon receipt of the second drag input starting at the second angle of the touch screen 151. That is, it is possible to easily enter applications having predetermined depths through diagonal drag inputs of the same pattern. FIG. 
20 is a flowchart illustrating a method of controlling a mobile terminal according to a third embodiment of the present invention, and FIGS. 21A to 23B are views for explaining an exemplary implementation of the method of controlling a mobile terminal according to the third embodiment of the present invention. The method of controlling a mobile terminal according to the third embodiment of the present invention can be implemented in the mobile terminal 100 described above with reference to FIGS. 1A to 1C. The method of controlling a mobile terminal according to the third embodiment of the present invention, and the operations of the mobile terminal 100 to implement the method, will be described below with reference to the accompanying drawings. The third embodiment of the present invention may be implemented based on the aforementioned first and second embodiments. In addition, the third embodiment may be combined with at least a portion of the first or second embodiment. With reference to FIG. 20, the third embodiment may be implemented while the display unit is on. The controller 180 may display an execution screen of a predetermined application on the touch screen 151 (S310) while the display unit is on (S300). In the third embodiment of the present invention, the first drag input starting at the first angle or the second drag input starting at the second angle may be received during execution of the predetermined application. The controller 180 can perform different functions according to the first drag input and the second drag input. For example, the controller 180 may receive the first drag input starting at the first angle and applied in a diagonal direction (S321). The controller 180 may then display information relating to the executed application (S323). Referring to FIGS. 
21A and 21B, when the first drag input starting from the first angle and applied in a diagonal direction is received while a web page A4 is displayed on the touch screen 151, information 203 relating to the web page may be displayed on the touch screen 151. The web page information 203 may include the data capacity of a web application corresponding to the web page, a menu for deleting the web application, and so on. Referring to FIG. 21B, the controller 180 may delete the web application from the mobile terminal 100 upon selection of the menu. Referring back to FIG. 20, the second drag input starting at the second angle may be received while the execution screen of the application is displayed (S331). The controller 180 may display the most recently captured images in a virtual zone generated according to the second drag input (S333). With reference to FIG. 22A, the controller 180 receives the second drag input starting at the second angle and applied in a diagonal direction while an application execution screen A5 is displayed on the touch screen 151. The controller 180 generates a virtual zone SA and displays the virtual zone SA on the touch screen 151 in response to the second drag input. The size of the virtual zone SA can be changed depending on the length of the second drag input. The controller 180 may display images stored in a gallery in the generated virtual zone SA. The aforementioned application may be a Messenger application. Referring to FIG. 22B, the controller 180 may define the virtual zone SA as a floating window QS of the gallery application and display images stored in the gallery on the floating window QS upon release of the second drag input after display of the virtual zone SA. The floating window QS may be displayed partially overlapping a previously executed application, and the position of the floating window QS may be changed by the user. Referring to FIG. 
22C, upon receipt of a drag input of a specific image I3 displayed on the floating window QS to an execution screen A4 of the Messenger application, the controller 180 may directly send the specific image I3 to a corresponding Messenger counterpart. With reference to FIG. 22D, the execution screen of the Messenger application may comprise a message input window 207. The controller 180 may attach the specific image I3 included in the floating window QS to the message input window 207 upon receipt of a drag input from the specific image I3 to the message input window 207. With reference to FIG. 23A, the controller 180 receives the second drag input starting at the second angle and applied in a diagonal direction during execution of the Messenger application. The controller 180 generates a virtual zone SA according to the second drag input and displays in the virtual zone SA images stored in the gallery. Upon release of the second drag input while the virtual zone SA is displayed, the controller 180 can display on the touch screen 151 a floating window QA displaying one or more applications which can be executed in the mobile terminal 100. Here, the controller 180 can keep only the layout of the generated virtual zone SA and delete the images displayed in the virtual zone SA. Referring to FIG. 23B, upon receiving an input to select a specific application from the applications displayed in the floating window QA, the controller 180 may display the selected application within the layout of the virtual zone SA. The size of the layout of the virtual zone SA can be changed by the user. According to an embodiment of the present invention, when a drag input starting at an angle of the display unit and applied in a diagonal direction is received while a specific screen is displayed on the display unit, a display change function for the specific screen can be executed. FIG. 
24 is a flowchart illustrating a method of controlling a mobile terminal according to a fourth embodiment of the present invention, and FIGS. 25A to 26C are views for explaining an exemplary implementation of the method of controlling a mobile terminal according to the fourth embodiment of the present invention. The method of controlling a mobile terminal according to the fourth embodiment of the present invention may be implemented in the mobile terminal 100 described above with reference to FIGS. 1A to 1C. The method of controlling a mobile terminal according to the fourth embodiment of the present invention, and operations of the mobile terminal 100 to implement the method, will be described below with reference to the accompanying drawings. The fourth embodiment of the present invention may be implemented based on the above-mentioned first embodiment of the present invention. Referring to FIG. 24, the controller 180 of the mobile terminal 100 may display a lock screen in a lock mode (S400). The fourth embodiment of the present invention is described on the assumption that the mobile terminal 100 is in the second lock mode. That is, the controller 180 displays the lock screen on the touch screen 151 while the mobile terminal 100 is in the second lock mode. The controller 180 may receive a drag input starting at a specific angle of the touch screen 151 and applied in a diagonal direction while the lock screen is displayed on the touch screen 151 (S420). When the drag input starts at the first angle, the controller 180 can execute the camera application (S431). When a predetermined image is captured using the camera application (S433), the controller 180 can set the captured image as a background image of the lock screen and display the background image (S435). When the drag input starts at the second angle, the controller 180 controls the mobile terminal 100 to enter the gallery application (S441). 
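The corner dispatch of steps S420 to S441 above, where the starting angle of the lock-screen drag decides which application is entered, can be sketched as follows. This is a minimal sketch under stated assumptions: the corner labels, callback style, and function name are hypothetical, and the real flow involves the camera or gallery UI rather than a callable.

```python
# Sketch of FIG. 24's dispatch: a diagonal drag from the first angle opens the
# camera and the captured image becomes the lock-screen wallpaper; a drag from
# the second angle opens the gallery and a picked image becomes the wallpaper.

def handle_lock_screen_drag(start_corner, pick_image):
    """start_corner: "C1" (camera path) or "C2" (gallery path).
    pick_image: callable standing in for capturing a photo or selecting one
    from the gallery. Returns (entered_app, wallpaper_image)."""
    if start_corner == "C1":
        app = "camera"       # S431: execute the camera application
    elif start_corner == "C2":
        app = "gallery"      # S441: enter the gallery application
    else:
        return (None, None)  # other corners are not handled in this sketch
    wallpaper = pick_image(app)  # S433/S443: captured or selected image
    return (app, wallpaper)      # S435/S445: image set as lock-screen background
```

The same drag pattern thus sets the wallpaper in one gesture, with no separate background-image settings step, as FIG. 25B illustrates.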
The controller 180 may receive an input for selecting a predetermined stored image through the gallery application (S443). The controller 180 may set the selected image as a background image of the lock screen and display the background image (S445). With reference to FIG. 25A, the controller 180 may display the lock screen LS on the touch screen 151 while the mobile terminal 100 is in the second lock mode. The second lock mode corresponds to a state in which power is supplied to the touch screen 151 to allow predetermined information to be provided through the touch screen 151, and the lock mode can be released according to a manipulation applied to the touch screen 151 or according to another predetermined manipulation, as described above. The controller 180 may display a preview image P on the touch screen 151 by executing the camera application upon receipt of the first drag input starting at the first angle and applied in a diagonal direction on the lock screen. After that, when a predetermined image is captured by execution of a capture function, the controller 180 may display the captured image as the background image 220 of the lock screen LS without a separate process of setting a background image of the lock screen LS, as shown in FIG. 25B. Referring to FIG. 25C, when the first drag input starting at the first angle is received again on the execution screen of the camera application of FIG. 25A, the controller 180 may execute a camera application different from the aforementioned camera application to capture an image. The different camera application may be a camera application having more functions than the previously executed camera application. For example, the different camera application may be a camera application having a filter effect. The captured image is displayed as a background image of the lock screen, as described above. Referring to FIG. 
26A, the controller 180 may receive the second drag input starting at the second angle and applied in a diagonal direction while the lock screen LS is displayed on the touch screen 151. The controller 180 may display one or more images (I1, I2, ..., I7) stored in a first gallery on the touch screen 151 in response to the second drag input. Referring to FIG. 26B, upon selection of a specific one of the images I1, I2, ..., I7, the controller 180 can set the selected image as a background image 227 of the lock screen LS and display the background image 227. Referring to FIG. 26C, upon a new receipt of the second drag input starting at the second angle while the images of the first gallery are displayed on the touch screen 151, the controller 180 can display images of a second gallery GA1 different from the first gallery. When a specific image is selected from the images of the second gallery GA1, the controller 180 can set the selected image as the background image 230 of the lock screen LS and display the background image 230. FIGS. 27 to 28C are views for explaining a method of controlling a mobile terminal according to a fifth embodiment of the present invention. Referring to FIG. 27, the controller 180 may receive a third drag input applied from the third angle of the touch screen 151 in a diagonal direction. The third angle C3 may correspond to the lower left corner of the touch screen 151. The controller 180 may display one or more application icons on the touch screen 151 depending on the length of the third drag input. The one or more applications may be recently executed applications. For example, when the user has recently used a calendar application A51, a web application A52 and a camera application A53, the applications recently used by the user may be displayed along the drag path upon receipt of the third drag input. 
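The length-dependent reveal described above, where more recent applications appear as the third drag grows longer (as with the virtual zone SA of the second embodiment, whose size follows the drag length), can be sketched as follows. The slot length, list contents, and function name are hypothetical assumptions for illustration only.

```python
# Sketch: reveal recently used applications along the third drag path,
# one additional icon per fixed increment of drag distance.

SLOT_LENGTH_PX = 150  # assumed drag distance needed to reveal one more icon

def visible_recent_apps(recent_apps, drag_length_px):
    """Return the prefix of the recently-used list that fits the drag length,
    most recent first (A51, A52, A53 in the example of FIG. 27)."""
    count = min(len(recent_apps), drag_length_px // SLOT_LENGTH_PX)
    return recent_apps[:count]
```

A short drag from the third angle would reveal only the most recently used application, while a drag across most of the screen reveals the full list; releasing over an icon then enters that application directly.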
Upon release of the third drag input at a specific point, the controller 180 can control the mobile terminal 100 to directly enter the application corresponding to the specific point. With reference to FIG. 28A, the screen entered upon release of the third drag input may be a specific screen of a predetermined application. For example, a calendar application screen may be provided. Additionally, the controller 180 may enter a predetermined schedule information input screen through the calendar application screen. Referring to FIG. 28B, when the third drag input starting at the third angle is applied again on the schedule information input screen, the controller 180 may display a list of sharing applications for sharing the schedule information. The list of sharing applications can be displayed along the drag path of the third drag input, and a larger number of applications can be displayed depending on the length of the drag input. [0059] The advantages of the mobile terminal and its control method according to the present invention will be described below. According to at least one embodiment of the present invention, it is possible to access a desired function more quickly by entering a predetermined pattern of touches while the display unit is turned off. According to at least one embodiment of the present invention, it is possible to quickly capture an image at a desired instant by automatically operating a camera according to a predetermined drag input starting at an angle of the touch screen while the display unit is off. [0060] Those skilled in the art will appreciate that the present invention may be carried out in specific ways other than those presented herein without departing from the spirit and essential features of the present invention. The aforementioned embodiments must therefore be interpreted in all their aspects as illustrative and not restrictive. 
The scope of the present invention should be determined by the appended claims and their legal equivalents, not by the description above, and all changes falling within the meaning and range of equivalence of the appended claims are intended to be included therein. Various embodiments may be implemented using a machine-readable medium on which are stored instructions for execution by a processor to perform the various methods presented herein. Examples of machine-readable media include hard disk drives (HDD), solid-state drives (SSD), silicon disk drives (SDD), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, the other types of storage media presented herein, and combinations thereof. If desired, the machine-readable medium may be embodied as a carrier wave (e.g., a transmission over the Internet). The processor may comprise the controller 180 of the mobile terminal. [0061] The aforementioned embodiments are given primarily by way of example and should not be construed as limiting the present disclosure. The present teachings can be readily applied to other types of methods and apparatus. This description is intended to be illustrative and not to limit the scope of the claims. Many alternatives, modifications and variations will be apparent to those skilled in the art. The characteristic functions, structures, methods and other features of the exemplary embodiments described herein may be combined in various ways to provide additional or alternative exemplary embodiments. As the present features may be realized in several forms without departing from their characteristics, it should also be understood that the embodiments described above are not limited by any of the details of the foregoing description, unless otherwise indicated, but should rather be construed broadly within their scope as defined in the appended claims; therefore, all changes and modifications that fall within the scope and bounds of the claims, or equivalents of such scope and bounds, are intended to be included in the appended claims.
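A minimal sketch of the corner-to-center drag dispatch that the disclosure describes (the class name, the corner-zone threshold, and the method names are assumptions for illustration, not taken from the patent):

```java
// Sketch of classifying a drag that starts at a screen angle (corner) and
// slides toward the center. Thresholds and names are illustrative assumptions.
public class CornerDragSketch {
    enum Corner { TOP_LEFT, TOP_RIGHT, BOTTOM_LEFT, BOTTOM_RIGHT, NONE }

    final int width, height;
    final int cornerZone; // assumed size of the corner hit area, in pixels

    CornerDragSketch(int width, int height, int cornerZone) {
        this.width = width;
        this.height = height;
        this.cornerZone = cornerZone;
    }

    /** Which corner, if any, contains the touch-down point. */
    Corner cornerAt(int x, int y) {
        boolean left = x < cornerZone, right = x > width - cornerZone;
        boolean top = y < cornerZone, bottom = y > height - cornerZone;
        if (top && left) return Corner.TOP_LEFT;
        if (top && right) return Corner.TOP_RIGHT;
        if (bottom && left) return Corner.BOTTOM_LEFT;
        if (bottom && right) return Corner.BOTTOM_RIGHT;
        return Corner.NONE;
    }

    /** True if the drag from (x0,y0) to (x1,y1) moved toward the screen center. */
    boolean movedTowardCenter(int x0, int y0, int x1, int y1) {
        double cx = width / 2.0, cy = height / 2.0;
        double before = Math.hypot(x0 - cx, y0 - cy);
        double after = Math.hypot(x1 - cx, y1 - cy);
        return after < before;
    }
}
```

Actuating the camera upon release, as recited in claim 1, would then amount to checking `cornerAt` at touch-down and `movedTowardCenter` at release before dispatching the capture action.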
Claims:
Claims (18)

[0001] 1. A mobile terminal (100), comprising: a body; a camera (121); a touch screen (151) disposed on the front face of the body and having a plurality of angles; and a controller (180) configured to actuate the camera (121) to capture an image upon receipt of a first drag input applied at a first angle C1 of the touch screen (151) and slid toward the center of the touch screen (151).

[0002] 2. The mobile terminal (100) of claim 1, wherein the first drag input is received while the touch screen (151) is off.

[0003] 3. The mobile terminal (100) of claim 2, wherein the controller (180) is configured to actuate the camera (121) to capture an image when the first drag input is released.

[0004] 4. The mobile terminal (100) of claim 2, wherein the first drag input includes a drag path toward the center of the touch screen (151) and comprises a plurality of discontinuous drag inputs applied along the drag path, and wherein the controller (180) is configured to control an image to be captured whenever the first drag input is discontinued.

[0005] 5. The mobile terminal (100) of claim 2, wherein, when the first drag input is held at a point on the drag path for a predetermined time, the controller (180) is configured to take a continuous shot while the first drag input is held.

[0006] 6. The mobile terminal (100) of claim 2, wherein the controller (180) is configured to actuate a first camera (121a) disposed on the front face of the body to capture an image in response to the first drag input when the first angle C1 corresponds to an upper angle of the touch screen (151), and to actuate a second camera (121b) disposed on the rear face of the body to capture an image in response to the first drag input when the first angle C1 corresponds to a lower angle of the touch screen (151).
[0007] 7. The mobile terminal (100) of claim 2, wherein, when the first drag input is received while a preview image is displayed on the touch screen (151) as the camera (121) operates, the controller (180) is configured to control the mobile terminal (100) to enter a two-camera mode by activating both a first camera (121a) disposed on the front face of the body and a second camera (121b) disposed on the rear face of the body.

[0008] 8. The mobile terminal (100) of claim 2, wherein the controller (180) is configured to check whether a received notification message is present upon receipt of a second drag input applied at a second angle C2 of the touch screen (151) and slid toward the center of the touch screen (151) and, when the received notification message is present, to display one or more application icons relating to the received notification message along a drag path of the second drag input.

[0009] 9. The mobile terminal (100) of claim 8, wherein one of the first angle C1 and the second angle C2 corresponds to one of the left and right angles of the touch screen (151) and the other corresponds to the other of said angles of the touch screen (151).

[0010] 10. The mobile terminal (100) of claim 8, wherein the received notification message comprises the number of unread text messages and/or the number of unanswered calls and/or application update information.

[0011] 11. The mobile terminal (100) of claim 8, wherein the controller (180) is configured to execute a first application corresponding to a point at which the second drag input is released upon release of the second drag input.

[0012] 12. The mobile terminal (100) of claim 11, wherein, when the second drag input is received during execution of the first application, the controller (180) is configured to execute a second application displayed along the drag path.
[0013] 13. The mobile terminal (100) of claim 8, wherein, when the second drag input is held at a specific point on the drag path for a predetermined time, the controller (180) is configured to maintain the display of said one or more application icons.

[0014] 14. The mobile terminal (100) of claim 7, wherein the controller (180) is configured to display at least one newly captured image in a window generated in response to the second drag input when the received notification message is not present, and to run a gallery application upon release of the second drag input.

[0015] 15. The mobile terminal (100) of claim 1, wherein the controller (180) is configured to display information relating to a specific application on the touch screen (151) upon receipt of the first drag input while an execution screen of the specific application is displayed on the touch screen (151).

[0016] 16. The mobile terminal (100) of claim 1, wherein the controller (180) is configured to display a lock screen corresponding to a lock mode on the touch screen (151) and, when the first drag input is received while the lock screen is displayed, to set an image captured by actuating the camera (121) as a background image of the lock screen.

[0017] 17. The mobile terminal (100) of claim 1, wherein, when a third drag input applied at a third angle C3 of the touch screen (151) and slid toward the center of the touch screen (151) is received, the controller (180) is configured to display one or more application icons along a drag path of the third drag input.

[0018] 18. A method of controlling a mobile terminal (100), comprising the steps of: receiving a first drag input applied at a first angle C1 of a touch screen (151) having a plurality of angles and slid toward the center of the touch screen (151); and actuating a camera (121) to capture an image upon release of the first drag input.
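As an illustration of the camera selection recited in claim 6, the angle at which the first drag input begins could select between the front and rear cameras; the enum and method names below are assumptions for the sketch, not taken from the claims:

```java
// Sketch of claim 6's mapping: a first drag input starting at an upper angle
// actuates the front camera (121a); one starting at a lower angle actuates the
// rear camera (121b). Names are illustrative assumptions.
public class CameraSelectSketch {
    enum Angle { UPPER_LEFT, UPPER_RIGHT, LOWER_LEFT, LOWER_RIGHT }
    enum Camera { FRONT_121A, REAR_121B }

    /** Camera to actuate for a first drag input starting at the given angle. */
    static Camera cameraFor(Angle first) {
        switch (first) {
            case UPPER_LEFT:
            case UPPER_RIGHT:
                return Camera.FRONT_121A;
            default:
                return Camera.REAR_121B;
        }
    }
}
```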
Similar technologies:
Publication number | Publication date | Patent title
FR3021133B1 | 2019-08-30 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3022368B1 | 2019-06-21 | WATCH-TYPE TERMINAL AND CONTROL METHOD THEREOF
FR3031601B1 | 2019-08-30 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021424B1 | 2019-09-20 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3029309A1 | 2016-06-03 |
US10564675B2 | 2020-02-18 | Mobile terminal and control method therefor
FR3026201A1 | 2016-03-25 |
FR3021766A1 | 2015-12-04 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3025328B1 | 2019-07-12 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
EP2999128B1 | 2018-10-24 | Mobile terminal and control method therefor
FR3022649A1 | 2015-12-25 |
FR3024786A1 | 2016-02-12 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
FR3021425A1 | 2015-11-27 |
FR3022367A1 | 2015-12-18 |
FR3022648A1 | 2015-12-25 |
FR3021134A1 | 2015-11-20 | MOBILE TERMINAL
FR3043478A1 | 2017-05-12 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021136A1 | 2015-11-20 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3039673A1 | 2017-02-03 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021767A1 | 2015-12-04 | MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
FR3041785A1 | 2017-03-31 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3019665A1 | 2015-10-09 |
FR3039674A1 | 2017-02-03 | MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021485A1 | 2015-11-27 | MOBILE DEVICE AND METHOD OF CONTROLLING THE SAME
FR3040221A1 | 2017-02-24 |
Patent family:
Publication number | Publication date
EP3026542B1 | 2019-06-26
US9927967B2 | 2018-03-27
EP3026542A1 | 2016-06-01
CN105653161A | 2016-06-08
US20160154559A1 | 2016-06-02
CN105653161B | 2020-09-15
KR20160063875A | 2016-06-07
FR3029309B1 | 2018-12-07
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US8159469B2 | 2008-05-06 | 2012-04-17 | Hewlett-Packard Development Company, L.P. | User interface for initiating activities in an electronic device
WO2010040670A2 | 2008-10-06 | 2010-04-15 | Tat The Astonishing Tribe Ab | Method for application launch and system function invocation
KR101597553B1 | 2009-05-25 | 2016-02-25 | 엘지전자 주식회사 | Function execution method and apparatus thereof
TWI441072B | 2010-09-21 | 2014-06-11 | Altek Corp | Touch screen unlock method and electric device with camera function thereof
TW201227393A | 2010-12-31 | 2012-07-01 | Acer Inc | Method for unlocking screen and executing application program
US20120180001A1 | 2011-01-06 | 2012-07-12 | Research In Motion Limited | Electronic device and method of controlling same
EP2474894A1 | 2011-01-06 | 2012-07-11 | Research In Motion Limited | Electronic device and method of controlling same
WO2013039046A1 | 2011-09-16 | 2013-03-21 | Necカシオモバイルコミュニケーションズ株式会社 | Information processing device having unlocking function
JP6132832B2 | 2012-03-26 | 2017-05-24 | 株式会社ザクティ | Electronic device, related information display method and program
CN102681774B | 2012-04-06 | 2015-02-18 | 优视科技有限公司 | Method and device for controlling application interface through gesture and mobile terminal
US8826178B1 | 2012-11-06 | 2014-09-02 | Google Inc. | Element repositioning-based input assistance for presence-sensitive input devices
CN103049209B | 2012-12-31 | 2016-04-06 | 广东欧珀移动通信有限公司 | Method and apparatus for starting the mobile phone camera while the phone screen is off
US9270889B2 | 2013-01-25 | 2016-02-23 | HTC Corporation | Electronic device and camera switching method thereof
US20140218313A1 | 2013-02-07 | 2014-08-07 | Kabushiki Kaisha Toshiba | Electronic apparatus, control method and storage medium
CN103402004A | 2013-07-23 | 2013-11-20 | 广东欧珀移动通信有限公司 | Method for pre-starting multiple cameras of a mobile terminal
US9111076B2 | 2013-11-20 | 2015-08-18 | LG Electronics Inc. | Mobile terminal and control method thereof
CN105446631B | 2014-08-19 | 2019-02-12 | 昆山纬绩资通有限公司 | Electronic device and photographic means with camera function
US20160373388A1 | 2015-06-19 | 2016-12-22 | Voxer IP LLC | Messaging application for recording and inserting a video message into a chat
US20180152622A1 | 2015-12-01 | 2018-05-31 | Huizhou TCL Mobile Communication Co., Ltd | Mobile terminal-based photographing method and mobile terminal
KR20170085760A | 2016-01-15 | 2017-07-25 | 삼성전자주식회사 | Method for controlling camera device and electronic device thereof
JP6696327B2 | 2016-07-01 | 2020-05-20 | 富士ゼロックス株式会社 | Information processing apparatus, image forming apparatus and program
CN106325671B | 2016-08-16 | 2019-05-28 | 浙江翼信科技有限公司 | Method and apparatus for replying to a message
EP3561651A1 | 2016-12-26 | 2019-10-30 | Shenzhen Royole Technologies Co., Ltd. | Display screen control method and apparatus
US10956029B1 | 2018-06-08 | 2021-03-23 | Facebook, Inc. | Gesture-based context switching between applications
Legal status:
2016-05-30 | PLFP | Fee payment | Year of fee payment: 2
2017-05-05 | PLSC | Search report ready | Effective date: 20170505
2017-05-30 | PLFP | Fee payment | Year of fee payment: 3
2018-05-29 | PLFP | Fee payment | Year of fee payment: 4
2019-04-10 | PLFP | Fee payment | Year of fee payment: 5
2020-04-08 | PLFP | Fee payment | Year of fee payment: 6
2021-04-09 | PLFP | Fee payment | Year of fee payment: 7
Priority:
Application number | Filing date | Patent title
KR1020140167706 | 2014-11-27 |
KR1020140167706A | KR20160063875A | 2014-11-27 | 2014-11-27 | Mobile terminal and method for controlling the same